Generalized Kullback-Leibler Divergence Minimization within a Scaled Bregman Framework

Authors

  • R. C. Venkatesan
  • Angel Plastino
Abstract

The generalized Kullback-Leibler divergence (K-Ld) in Tsallis statistics, subjected to the additive duality of generalized statistics (the dual generalized K-Ld), is reconciled with the theory of Bregman divergences for expectations defined by normal averages, within a measure-theoretic framework. Specifically, it is demonstrated that the dual generalized K-Ld is a scaled Bregman divergence. The Pythagorean theorem is derived from the minimum discrimination information principle using the dual generalized K-Ld as the measure of uncertainty, with constraints defined by normal averages. The minimization of the dual generalized K-Ld under normal-averages constraints is shown to possess distinctive features.
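For orientation, a brief sketch of the standard Tsallis-statistics objects the abstract refers to; the notation below is illustrative, and conventions (in particular signs and index placement) vary across the literature:

    % q-deformed logarithm (recovers ln x as q -> 1)
    \ln_q x \;=\; \frac{x^{\,1-q} - 1}{1 - q}

    % one common form of the generalized K-Ld between distributions p and r
    D_q\!\left[ p \,\|\, r \right] \;=\; \sum_x p(x)\, \ln_q \frac{p(x)}{r(x)}

    % additive duality of generalized statistics: the dual generalized
    % K-Ld is the same functional taken at the dual deformation index
    q^* \;=\; 2 - q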


Similar references

Deformed Statistics Kullback-Leibler Divergence Minimization within a Scaled Bregman Framework

The generalized Kullback-Leibler divergence (K-Ld) in Tsallis statistics [constrained by the additive duality of generalized statistics (dual generalized K-Ld)] is here reconciled with the theory of Bregman divergences for expectations defined by normal averages, within a measure-theoretic framework. Specifically, it is demonstrated that the dual generalized K-Ld is a scaled Bregman divergence....
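As a point of reference, the scaled Bregman divergence of Stummer and Vajda, which both of these papers build on, takes the following form in the discrete case for a convex generating function \phi and a scaling distribution m; the measure-theoretic statement in the papers is more general, so this is only a sketch:

    B_\phi\!\left(P, R \mid M\right) \;=\; \sum_x m(x) \left[
        \phi\!\left(\frac{p(x)}{m(x)}\right)
      - \phi\!\left(\frac{r(x)}{m(x)}\right)
      - \phi'\!\left(\frac{r(x)}{m(x)}\right)
        \left(\frac{p(x)}{m(x)} - \frac{r(x)}{m(x)}\right) \right]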


Scaled Bregman divergences in a Tsallis scenario

There exist two different versions of the Kullback-Leibler divergence (K-Ld) in Tsallis statistics, namely the usual generalized K-Ld and the generalized Bregman K-Ld. Problems have been encountered in trying to reconcile them. A condition for consistency between these two generalized K-Ld forms is derived by recourse to the additive duality of Tsallis statistics. It is also shown that the usua...
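The mechanism by which the additive duality can mediate between two K-Ld forms is already visible at the level of the deformed logarithm. With \ln_q as defined above and the dual index q* = 2 - q, a direct computation gives

    \ln_{2-q} x \;=\; -\ln_q\!\left(\frac{1}{x}\right)
    \quad\Longrightarrow\quad
    \sum_x p(x)\, \ln_{2-q} \frac{p(x)}{r(x)}
    \;=\; -\sum_x p(x)\, \ln_q \frac{r(x)}{p(x)}

so a relative entropy written with index q in one convention coincides with the dual-index divergence in the other.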


Proximity Function Minimization Using Multiple Bregman Projections, with Applications to Split Feasibility and Kullback-Leibler Distance Minimization

Problems in signal detection and image recovery can sometimes be formulated as a convex feasibility problem (CFP) of finding a vector in the intersection of a finite family of closed convex sets. Algorithms for this purpose typically employ orthogonal or generalized projections onto the individual convex sets. The simultaneous multiprojection algorithm of Censor and Elfving for solving the CFP,...
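Not the algorithm of this paper itself, but a minimal sketch of the idea it builds on: a simultaneous (averaged) projection method for a convex feasibility problem, here with orthogonal projections onto half-spaces. The function names and the toy constraints are illustrative assumptions.

    import numpy as np

    def project_halfspace(x, a, b):
        # Orthogonal projection of x onto the half-space {y : <a, y> <= b}.
        violation = np.dot(a, x) - b
        return x - (violation / np.dot(a, a)) * a if violation > 0 else x

    def simultaneous_projection(x, constraints, iters=200):
        # Each sweep projects the current iterate onto every set and
        # averages the results (the simultaneous-multiprojection idea).
        for _ in range(iters):
            x = np.mean([project_halfspace(x, a, b) for a, b in constraints],
                        axis=0)
        return x

    # Two half-spaces in the plane: x1 <= 1 and x2 <= 2.
    constraints = [(np.array([1.0, 0.0]), 1.0),
                   (np.array([0.0, 1.0]), 2.0)]
    print(simultaneous_projection(np.array([5.0, 5.0]), constraints))

For a consistent problem such as this one, the iterates converge to a point in the intersection, here approximately (1, 2).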


Submodular-Bregman and the Lovász-Bregman Divergences with Applications

We introduce a class of discrete divergences on sets (equivalently binary vectors) that we call the submodular-Bregman divergences. We consider two kinds, defined either from tight modular upper or tight modular lower bounds of a submodular function. We show that the properties of these divergences are analogous to the (standard continuous) Bregman divergence. We demonstrate how they generalize...
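A hedged sketch of the lower-bound flavor of this construction: a submodular f admits subgradients h_Y obtained from a greedy chain that lists the elements of Y first, and the divergence d(X, Y) = f(X) - f(Y) - <h_Y, 1_X - 1_Y> is nonnegative and vanishes at X = Y. The particular f below (concave of cardinality) and all names are illustrative choices, not the paper's.

    import math

    def f(S):
        # A simple submodular function: concave of cardinality, sqrt(|S|).
        return math.sqrt(len(S))

    def subgradient(Y, V):
        # Greedy (Edmonds-style) subgradient of f at Y: order the ground
        # set with the elements of Y first and take the marginal gains
        # along the resulting chain.
        order = sorted(Y) + sorted(set(V) - set(Y))
        h, prev, chain = {}, 0.0, set()
        for v in order:
            chain.add(v)
            h[v] = f(chain) - prev
            prev = f(chain)
        return h

    def lower_submodular_bregman(X, Y, V):
        # d(X, Y) = f(X) - f(Y) - <h_Y, 1_X - 1_Y>; nonnegative because
        # h_Y underestimates f everywhere and is tight at Y.
        h = subgradient(Y, V)
        inner = sum(h[v] for v in X) - sum(h[v] for v in Y)
        return f(X) - f(Y) - inner

    V = {1, 2, 3, 4}
    print(lower_submodular_bregman({1, 2}, {2, 3}, V))  # >= 0 by construction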


Metrics Defined by Bregman Divergences

Bregman divergences are generalizations of the well-known Kullback-Leibler divergence. They are based on convex functions and have recently received great attention. We present a class of “squared root metrics” based on Bregman divergences. They can be regarded as a natural generalization of Euclidean distance. We provide necessary and sufficient conditions on a convex function so that the squar...
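The simplest instance of such a square-root metric, as a sanity check rather than the paper's general criterion: for \phi(x) = ||x||^2 the Bregman divergence is the squared Euclidean distance, so its square root is a genuine metric. The helper names below are illustrative.

    import numpy as np

    def bregman(phi, grad, x, y):
        # B_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>
        return phi(x) - phi(y) - np.dot(grad(y), x - y)

    phi  = lambda x: np.dot(x, x)   # phi(x) = ||x||^2
    grad = lambda x: 2.0 * x

    def root_metric(u, v):
        # Square root of the Bregman divergence; for this phi it is
        # exactly the Euclidean distance, hence a metric.
        return np.sqrt(bregman(phi, grad, u, v))

    x, y, z = np.array([0.0, 1.0]), np.array([1.0, 0.0]), np.array([2.0, 2.0])
    print(root_metric(x, z) <= root_metric(x, y) + root_metric(y, z))  # True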



Journal:
  • CoRR

Volume: abs/1102.1025

Publication date: 2011